23 - Interventional Medical Image Processing (IMIP) 2012 [ID:2304]

The following content has been provided by the University of Erlangen-Nürnberg.

This summer semester we have been talking about medical image processing for an interventional environment, and we did some math. What type of math did we learn? We learned about the SVD, we learned about homogeneous coordinates, and we learned about projection models, projection models using three-by-four matrices, if you remember that. And we had one refresher course on a topic that some of you have already seen in math lectures, and that was the basics of variational calculus. There we basically considered the following problem: we have an integral from x1 to x2 of a function depending on x, f and f', and this has to be optimized subject to the constraints f(x1) = f1 and f(x2) = f2.
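Written out, this is the classical fixed-endpoint problem of variational calculus; the integrand is denoted F here as a generic placeholder:

```latex
% Fixed-endpoint problem: find the function f that extremizes J.
J[f] = \int_{x_1}^{x_2} F\bigl(x, f(x), f'(x)\bigr)\,\mathrm{d}x \;\to\; \text{extremum},
\qquad
f(x_1) = f_1, \quad f(x_2) = f_2.
```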

Good. So we know how these things work. If I ask you: SVD, what do you answer? What's important to know? What is the SVD? No, it's not the eigenvalue decomposition but the singular value decomposition. So the values are what? The singular values. So it's M = U Σ V^T, and U and V are what type of matrices? Orthonormal matrices, so they are basically rotations, and Σ is a diagonal matrix doing nothing else but a scaling of the various dimensions, right?
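As a quick numerical check of exactly these properties, here is a minimal sketch assuming NumPy is available (the example matrix is arbitrary):

```python
import numpy as np

# Any real matrix M factors as M = U @ diag(sigma) @ V^T, with the
# columns of U and rows of V^T orthonormal and sigma non-negative.
M = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0]])

U, sigma, Vt = np.linalg.svd(M, full_matrices=False)

# U and V behave like rotations (orthonormal columns/rows) ...
assert np.allclose(U.T @ U, np.eye(U.shape[1]))
assert np.allclose(Vt @ Vt.T, np.eye(Vt.shape[0]))

# ... and Sigma only rescales the individual dimensions.
assert np.allclose(U @ np.diag(sigma) @ Vt, M)
```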

Homogeneous coordinates: how do you explain to someone what homogeneous coordinates are? Matthias? Why should I do that? Okay: perspective projection can be rewritten in terms of matrix multiplication and matrix calculus.
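The core trick, spelled out: a point is represented only up to a common scale factor, and dividing by the last component afterwards is what absorbs the perspective division into a linear map.

```latex
% A 2-D point and its homogeneous representatives (any nonzero scale
% lambda describes the same point):
(x, y) \;\longleftrightarrow\; \lambda\,(x,\; y,\; 1), \qquad \lambda \neq 0.

% Dehomogenization: divide by the last component.
(u, v, w) \;\mapsto\; (u/w,\; v/w), \qquad w \neq 0.
```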

Projection models: what do you explain if somebody asks you about projection models? Which types of projections do you know? Julia? You did, sorry. Yes, what is the idea of orthogonal projection? Basically, forget about the z component; or the boyfriend who walks away into the distance and does not shrink in size, that's orthogonal projection. He goes away, he walks and walks, and he's always the same size: that's orthogonal projection. And perspective projection? Yeah, z is showing up in the denominator, and that means the farther away objects are, the smaller they appear. So if your boyfriend is walking away, he's getting smaller and smaller and smaller, okay?
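With f denoting a focal length (the symbol is chosen here just for illustration), the two models can be written as three-by-four matrices acting on homogeneous 3-D points (x, y, z, 1)^T:

```latex
% Orthogonal projection: z is simply dropped, so apparent size does not
% depend on depth.
P_{\text{ortho}} =
\begin{pmatrix}
1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 0 & 1
\end{pmatrix}

% Perspective projection: after dehomogenization z ends up in the
% denominator, so far-away objects appear smaller.
P_{\text{persp}} =
\begin{pmatrix}
f & 0 & 0 & 0\\
0 & f & 0 & 0\\
0 & 0 & 1 & 0
\end{pmatrix},
\qquad
(x, y, z, 1)^{\mathsf T} \;\mapsto\; \left(\tfrac{f x}{z},\; \tfrac{f y}{z}\right).
```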

David, what is the difference between variational calculus and parameter estimation? Yes. No. Parameter estimation depends on parametric models: you have to estimate parameters based on observations. Usually you take an objective function, you compute the partial derivatives, and the points where the partial derivatives vanish give you the optimal parameter values. In variational calculus we leave the space of parameters and consider functions. Functions f, f': we have an objective functional, and we look for functions that optimize this functional. The necessary condition for parameter estimation is that the gradient has to be zero, and for variational calculus the Euler-Lagrange partial differential equation has to be fulfilled. And the Euler-Lagrange partial differential equation is... Katya? Katya? Katya, the Euler-Lagrange? Let me think. Maybe. Okay, so the Euler-Lagrange equation is the Euler-Lagrange equation. Okay, is that right? That's right. I hope that's right. So we have to solve this.
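For reference, the two necessary conditions stated here, written out (θ is the parameter vector with objective J(θ); F is the integrand of the functional from the refresher above):

```latex
% Parameter estimation: the gradient of the objective must vanish.
\nabla_{\theta}\, J(\theta) = 0

% Variational calculus: the Euler-Lagrange equation must hold for
% J[f] = \int_{x_1}^{x_2} F(x, f, f')\, dx with fixed endpoints.
\frac{\partial F}{\partial f} \;-\; \frac{\mathrm{d}}{\mathrm{d}x} \frac{\partial F}{\partial f'} \;=\; 0
```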

And we have applied all these things now, all these tough math topics, we have applied them to different problem domains in medical imaging. So, just to make the focus of this lecture clear, and just to bring the focus of the lecture below the average of the lectures of the technical faculty here: what did we do? We first looked at a few basic operators, at how to do image processing. I remember your first comment, saying we are doing only the complicated stuff here, it must be very easy to detect edges. Of course there are simple algorithms, but they are 30 years old and not as robust as the modern ones. And we learned about the structure tensor, which is nothing else but the covariance matrix of the gradient vector in a local neighborhood of the considered image point. And this covariance matrix has certain properties: if we have a flat region, the two singular values, or eigenvalues, of the covariance matrix are close to zero. If we have an edge, one lambda is larger than zero and the other one is close to zero. And if we have a corner, the two eigenvalues are similar to each other and clearly different from zero. So: the structure tensor.
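As a rough sketch of how this classification can be computed (assuming NumPy and SciPy; the smoothing scale is an illustrative choice, not a value from the lecture):

```python
import numpy as np
from scipy import ndimage

def structure_tensor_eigenvalues(image, sigma=1.5):
    """Return the two eigenvalue maps (larger, smaller) of the structure
    tensor, i.e. the locally averaged covariance of the image gradient."""
    ix = ndimage.sobel(image, axis=1, output=float)
    iy = ndimage.sobel(image, axis=0, output=float)

    # Gaussian-weighted averaging of the gradient outer products gives
    # the entries of the 2x2 structure tensor at every pixel.
    jxx = ndimage.gaussian_filter(ix * ix, sigma)
    jxy = ndimage.gaussian_filter(ix * iy, sigma)
    jyy = ndimage.gaussian_filter(iy * iy, sigma)

    # Closed-form eigenvalues of the symmetric matrix [[jxx, jxy], [jxy, jyy]].
    trace = jxx + jyy
    root = np.sqrt((jxx - jyy) ** 2 + 4.0 * jxy ** 2)
    lam1 = 0.5 * (trace + root)  # flat: ~0, edge: large, corner: large
    lam2 = 0.5 * (trace - root)  # flat: ~0, edge: ~0,    corner: large
    return lam1, lam2
```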

Then we considered one problem that is very interesting: we talked about magnetic navigation, and that was the problem of having a catheter, where the catheter is in the vessel system and we have to guide the catheter using an external magnetic field

Accessible via: Open Access
Duration: 00:36:33 min
Recording date: 2012-07-10
Uploaded on: 2012-07-16 13:25:43
Language: en-US
